K-Nearest Neighbors Directed Noise Injection in Multilayer Perceptron Training
Authors
M. Skurichina, Š. Raudys, R.P.W. Duin
Abstract
Similar resources
K-nearest Neighbors Directed Noise Injection in Multilayer Perceptron Training
The relation between classifier complexity and learning-set size is very important in discriminant analysis. One way to overcome the complexity-control problem is to add noise to the training objects, thereby increasing the size of the training set. Both the amount and the direction of the injected noise are important factors that determine its effectiveness for classifier training. ...
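As a rough illustration of the idea described above, the following is a minimal numpy sketch of noise injection directed along nearest-neighbour directions: each training object is copied several times, displaced part of the way toward one of its k nearest neighbours. The function name knn_directed_noise and the parameters copies and scale are hypothetical, and the paper's actual scheme (for example, how class labels and the noise variance are handled) may differ.

import numpy as np

def knn_directed_noise(X, k=2, copies=1, scale=0.5, rng=None):
    # Generate noisy copies of each training object, displaced along
    # directions toward its k nearest neighbours (hypothetical sketch).
    rng = np.random.default_rng(rng)
    n = len(X)
    # Brute-force pairwise Euclidean distances; exclude each object itself.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn_idx = np.argsort(d, axis=1)[:, :k]   # indices of the k nearest neighbours

    augmented = [X]
    for _ in range(copies):
        # Pick one of the k neighbours per object and step a random
        # fraction of the way toward it.
        pick = nn_idx[np.arange(n), rng.integers(0, k, size=n)]
        step = rng.uniform(0.0, scale, size=(n, 1))
        augmented.append(X + step * (X[pick] - X))
    return np.vstack(augmented)

# Usage: enlarge a small training set before fitting an MLP.  For a
# classifier, neighbours would typically be taken within the same class
# so that the generated objects keep their labels.
X_train = np.array([[0.0, 0.0], [1.0, 0.2], [0.9, 1.1], [0.1, 1.0]])
X_big = knn_directed_noise(X_train, k=2, copies=3, scale=0.3, rng=0)
print(X_big.shape)   # (16, 2): the 4 originals plus 3 directed-noise copies each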
Full text
K-Nearest Neighbours Directed Noise Injection in Multilayer Perceptron Training
M. Skurichina1, Š. Raudys2 and R.P.W. Duin1. 1Pattern Recognition Group, Department of Applied Physics, Delft University of Technology, P.O. Box 5046, 2600 GA Delft, The Netherlands. E-mail: [email protected], [email protected]. 2Department of Data Analysis, Institute of Mathematics and Informatics, Akademijos 4, Vilnius 2600, Lithuania. E-mail: [email protected]. Abstract T...
Full text
IEEE TNN A172Rev K Nearest Neighbours Directed Noise Injection in Multilayer Perceptron Training
IEEE TNN A172Rev K Nearest Neighbours Directed Noise Injection in Multilayer Perceptron Training. M. Skurichina1, Š. Raudys2 and R.P.W. Duin1. 1Pattern Recognition Group, Department of Applied Physics, Delft University of Technology, P.O. Box 5046, 2600 GA Delft, The Netherlands. E-mail: [email protected], [email protected]. 2Department of Data Analysis, Institute of Mathematics and Informa...
Full text
Multilayer Perceptron Training
In this contribution we present an algorithm for using possibly inaccurate knowledge of model derivatives as part of the training data for a multilayer perceptron network (MLP). In many practical process-control problems there are well-known rules about the effect of the control variables on the target variables. With the presented algorithm, the basically data-driven neural network model can ...
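The snippet above is truncated; one common way to use known input-output derivatives in network training is to add a penalty that pulls the model's input derivatives toward the given values. The numpy sketch below is a hypothetical illustration of that idea only, not the cited paper's algorithm: the name derivative_informed_loss, the weight lam, and the finite-difference derivative estimate are all assumptions.

import numpy as np

def derivative_informed_loss(model, X, y, X_d, dy_dx, lam=0.1, eps=1e-4):
    # Ordinary squared-error fit on labelled data (X, y).
    fit = np.mean((model(X) - y) ** 2)

    # Central finite-difference estimate of d model / d x at the points X_d,
    # compared against the (possibly inaccurate) known derivatives dy_dx.
    grads = np.empty_like(X_d)
    for j in range(X_d.shape[1]):
        step = np.zeros_like(X_d)
        step[:, j] = eps
        grads[:, j] = (model(X_d + step) - model(X_d - step)) / (2.0 * eps)
    deriv = np.mean((grads - dy_dx) ** 2)

    # Composite objective: data fit plus weighted derivative penalty.
    return fit + lam * deriv

# Usage with a toy linear "model" whose true input derivative is [2, -1]:
w = np.array([2.0, -1.0])
model = lambda X: X @ w
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = model(X)
dy_dx = np.tile(w, (len(X), 1))
print(derivative_informed_loss(model, X, y, X, dy_dx))  # ~0.0 for a perfect fit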
Full text
Approximate K Nearest Neighbors in High Dimensions
Given a set P of N points in a d-dimensional space, along with a query point q, it is often desirable to find k points of P that are, with high probability, close to q. This is the Approximate k-Nearest-Neighbors (AkNN) problem. We present two algorithms for AkNN. Both require O(Nd) preprocessing time. The first algorithm has a query-time cost of O(d + log N), while the second has a query-time cost that...
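To make the approximate-search trade-off concrete, here is a toy numpy sketch that trades exactness for speed by projecting all points onto one random direction and checking exact distances only within a small window around the query's projection. This is an illustrative assumption on my part, not one of the two algorithms from the cited paper; build_index, query, and window are hypothetical names.

import numpy as np

def build_index(P, seed=None):
    # Project all N points onto one random unit direction (O(N d) work)
    # and sort them by that projection.
    rng = np.random.default_rng(seed)
    u = rng.normal(size=P.shape[1])
    u /= np.linalg.norm(u)
    keys = P @ u
    order = np.argsort(keys)
    return P, u, keys[order], order

def query(index, q, k=3, window=20):
    P, u, keys_sorted, order = index
    # Locate q's projection, then compute exact distances only for a
    # small window of candidates around it: fast, but only approximate.
    pos = np.searchsorted(keys_sorted, q @ u)
    lo, hi = max(pos - window, 0), min(pos + window, len(order))
    cand = order[lo:hi]
    dist = np.linalg.norm(P[cand] - q, axis=1)
    return cand[np.argsort(dist)[:k]]

# Usage on random data.
rng = np.random.default_rng(0)
P = rng.normal(size=(1000, 16))
index = build_index(P, seed=1)
print(query(index, rng.normal(size=16), k=5))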
Full text
Journal
Journal title: IEEE Transactions on Neural Networks
Year: 2000
ISSN: 1045-9227
DOI: 10.1109/72.839019